Ultra-high spatial resolution sensors, made available by advances in the miniaturization of instruments deployable on uncrewed aerial vehicles (UAVs), present new and innovative opportunities for the remote detection of marine wildlife. The spatio-temporal resolution and survey responsiveness afforded by these low-cost platforms enable the collection of data that can provide insights into the spatio-temporal dynamics of individual marine animals at close range (Anderson & Gaston, 2013). The last decade has seen an increase in marine studies utilizing UAVs (Schofield et al., 2019), yielding novel insights into the abundance (Hodgson et al., 2017), behavior (Fiori et al., 2020; Torres et al., 2018), and body condition (Christiansen et al., 2016; Hodgson et al., 2020) of marine wildlife. Importantly, UAV-based image capture has the potential to extend the duration of visual observation through detection of animals below the water surface (Torres et al., 2018). This has significant ramifications for the study of humpback whale mother-calf groups that rest in shallow protected embayments (Bruce et al., 2014; McCulloch et al., 2021), spending high proportions of time resting at depths of <5 m (Bejder et al., 2019; Iwata et al., 2021). However, the use of remotely sensed data in coastal environments is challenged by the optical complexity of the water column (Figure 1). Research to date has focused primarily on detection in visible-spectrum imagery, with limited evaluation of conventional remote sensing methods for enhancing the observation of whales. Processing techniques are commonly used in remote sensing research to enhance or enable the detection of underwater features, including benthic habitat (Hedley et al., 2016; Mumby et al., 1998; Zoffoli et al., 2014). Similar techniques, including water column correction, can be applied to UAV-captured images to enhance the visibility of animals below the water surface that may be missed in manual counts or automated deep-learning-based classifications (Gray et al., 2019). This paper presents remote sensing-based methods for enhancing UAV-acquired visual image data to improve the contrast of whales at the water surface and submerged near the surface. Application of these methods has the potential to reduce perception error (Colefax et al., 2018) and subsequently improve detection rates.

The Jervis Bay Marine Park study site on the eastern Australian coastline is frequented by humpback whale (Megaptera novaeangliae) mother-calf groups during the southern migration from the breeding grounds (Bruce et al., 2014; Jones, 2019). The visible RGB imaging capabilities of sensors for detecting whales were evaluated using a DJI Mavic 2 Enterprise Dual (M2ED) launched from a small boat in Jervis Bay. Following a confirmed whale sighting, the whale's behavior and direction of travel were observed for 5 min from the boat at a distance >300 m before an approach was made to a distance >100 m from the whale. The UAV was launched to an initial altitude of 55 m to provide a sufficient sensor field of view for whale identification and was lowered to ≥25 m once whale(s) were visible on the controller screen. The boat remained >100 m from the whales during flights to maintain a visual line of sight to the UAV and to facilitate positioning over the whales. Still images and videos were captured throughout each flight.

Lyzenga's water column correction (Lyzenga, 1981), as modified by Mumby et al. (1998) and Hamylton (2011), was applied to three UAV images containing humpback whales (steps summarized in Figure 2). To enhance the spectral signature of the whale, radiance values were taken from transects along and across the whale's body surface to account for its fusiform shape (Step 3, Figure 2). This method, suitable for high-clarity water, produces a depth-invariant band from each pair of spectral (wavelength) bands (Mumby et al., 1998), generating three depth-invariant bands from the available band pairs (red/green, red/blue, green/blue). To allow comparison between the original image and the processed images, all pixels were converted into Z-scores (Z = [pixel value − mean]/standard deviation), enabling the detection of anomalies.
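A minimal sketch of these processing steps in Python (NumPy) is given below. It assumes the attenuation-coefficient ratio ki/kj is calibrated from log-transformed radiances sampled over a uniform substrate (e.g., sand) at varying depth, following Mumby et al. (1998); the function names are ours, and deep-water radiance subtraction is omitted for simplicity, so this is an illustration rather than the exact workflow of Figure 2.

```python
import numpy as np

def attenuation_ratio(xi, xj):
    """Estimate ki/kj from log-radiance samples over a uniform
    substrate at varying depth (Lyzenga, 1981; Mumby et al., 1998)."""
    a = (np.var(xi, ddof=1) - np.var(xj, ddof=1)) / (2 * np.cov(xi, xj)[0, 1])
    return a + np.sqrt(a**2 + 1)

def depth_invariant_band(band_i, band_j, substrate_mask):
    """Derive a depth-invariant band from a pair of spectral bands.

    band_i, band_j: 2-D radiance arrays (e.g., red and green).
    substrate_mask: boolean array marking a uniform substrate at
    varying depth, used to calibrate the attenuation ratio.
    """
    # Log-transform linearizes the exponential attenuation of
    # radiance with depth; the small offset guards against log(0).
    xi = np.log(band_i.astype(float) + 1e-6)
    xj = np.log(band_j.astype(float) + 1e-6)
    ratio = attenuation_ratio(xi[substrate_mask], xj[substrate_mask])
    return xi - ratio * xj

def z_scores(image):
    """Convert pixels to Z-scores, Z = (value - mean) / std, so that
    anomalies are comparable across original and processed images."""
    return (image - image.mean()) / image.std()
```

The three depth-invariant bands would then follow from the red/green, red/blue, and green/blue pairs, for example z_scores(depth_invariant_band(red, green, substrate_mask)).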
The performance of the applied image processing methods was assessed visually (Figure 3) and quantitatively, by evaluating the mean Z-score values (Table 1) extracted across the whale's surface for the original RGB images and the three resulting band pairs.

Application of Lyzenga's water column correction enhanced the contrast and edge definition between whales and the surrounding water in the three UAV-captured images presented here. For whales on or just below the surface, the red/green and red/blue depth-invariant band pairs were the most effective combinations for enhancing contrast between the whale and the surrounding waters, both visually (Figure 3A and C) and in producing the highest anomaly values (Table 1). The effectiveness of the red band for animals on or near the surface is consistent with findings from Colefax et al. (2021), who demonstrated that the red band produced the greatest spectral contrast for detecting dolphins, sharks, and other marine fauna from a UAV. At increasing depths, the green/blue band pair produced a stronger visual contrast and the highest anomaly result (Figure 3B; Table 1), owing to the red waveband becoming attenuated at greater depth. The depth of the target animal(s) will therefore influence the optimal band pairing and the resulting object enhancement within the image. The red band (610–700 nm), and other bands with wavelengths approaching the red end of the spectrum (e.g., red edge and near infrared), become fully attenuated at around 2–3 m depth (Hamylton, 2011; Mishra et al., 2005); thus, the effectiveness of band pairs using the red band will be substantially reduced at this depth. However, in turbid waters, where sightability below the surface is limited, longer wavelength bands may be optimal for increasing the contrast of animals just below the surface (Colefax et al., 2021). Image enhancement methods that improve whale detection rates even in optically shallow waters (<2–3 m) have important implications for studies reliant on whale visibility, such as estimating abundance (e.g., Hodgson et al., 2017). This is key for research conducted on breeding/resting grounds, where lactating females and their calves spend over 50% of their time within 3 m of the surface (Bejder et al., 2019).
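Because the optimal band pair depends on target depth, selecting it could be automated by comparing anomaly strength over the whale's body, as in the sketch below. The dictionary structure, the use of the absolute Z-score, and the function name are illustrative assumptions rather than the exact procedure behind Table 1; the whale mask stands in for pixels digitized along the body transects.

```python
import numpy as np

def rank_band_pairs(bands, whale_mask):
    """Rank images by mean absolute Z-score over the whale's surface,
    mirroring the quantitative assessment summarized in Table 1.

    bands: dict mapping a label (e.g., 'red/green') to a 2-D image,
           including depth-invariant bands and original RGB bands.
    whale_mask: boolean array marking pixels on the whale's body.
    """
    scores = {}
    for label, img in bands.items():
        z = (img - img.mean()) / img.std()            # per-image Z-scores
        scores[label] = np.abs(z[whale_mask]).mean()  # anomaly strength
    # A larger mean |Z| implies stronger contrast against surrounding water.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```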
The methods presented here also have the potential to increase the confidence of machine learning and automated detection models. Machine learning techniques for the automated detection of whales are increasingly being applied to UAV-collected data sets, providing an efficient way to process large amounts of data. Studies using machine learning to detect individual animals or plants typically rely on a clear contrast between target objects and mostly homogeneous backgrounds to reliably discriminate animals from their surrounding environment (Laliberte & Ripple, 2003). The use of machine learning for detecting marine vertebrates is further complicated by coloration similarities between some species and the surrounding water (e.g., blue whales; Gray et al., 2019) and by nonuniform backgrounds that are subject to varying environmental conditions. In ecological studies, object-based image analysis (OBIA) is the most commonly used machine learning method (Dujon & Schofield, 2019). It provides versatility for detecting objects in varied backgrounds with confounding features, objects that vary in size and shape, and objects that may be sparsely distributed in the image data set (Chabot et al., 2018), making it ideal for abundance and distribution studies. Importantly, even with increased flexibility over other machine learning methods, OBIA still relies on the object(s) of interest being localized from surrounding pixels through a local brightness contrast, either as a relatively brighter or relatively darker group of image pixels (Groom et al., 2013). For whale detection, increasing the anomaly (Z-score) of the whale object, as demonstrated here, could therefore improve OBIA-derived results and increase detection availability, as illustrated in the sketch below.
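As a simplified stand-in for the object-based segmentation step, the following sketch thresholds a Z-score image and retains contiguous anomalous regions. NumPy and SciPy are our choice of tooling rather than software used in this study, and the anomaly threshold and minimum object size are illustrative parameters, not values derived from these data.

```python
import numpy as np
from scipy import ndimage

def segment_anomalies(z_image, z_threshold=2.0, min_pixels=50):
    """Localize candidate whale objects as contiguous groups of
    anomalous pixels in a Z-score image (e.g., a depth-invariant band).

    z_threshold: anomaly magnitude above which a pixel is treated
                 as distinct from the surrounding water.
    min_pixels: minimum object size, to suppress glint and noise.
    """
    # Whales may appear brighter or darker than the water, so
    # threshold on the anomaly magnitude rather than its sign.
    candidate = np.abs(z_image) > z_threshold
    labels, n = ndimage.label(candidate)
    # Discard small connected components below the size threshold.
    sizes = ndimage.sum(candidate, labels, index=range(1, n + 1))
    keep = np.isin(labels, np.flatnonzero(sizes >= min_pixels) + 1)
    return np.where(keep, labels, 0)
```

The surviving labeled regions could then be passed to an OBIA classifier, with the expectation that water-column-corrected inputs yield stronger, more cleanly delineated whale objects.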
Here we have demonstrated that water column correction techniques can enhance the visual outline and anomaly signal over the body of whales at the surface and submerged at shallow depths. However, we recognize the constraints associated with the limited spectral resolution of three bands (i.e., red, green, and blue) within the visible range (400–700 nm). Employing multispectral or hyperspectral sensors, with bands selected or configured for the spectral characteristics of the water body (e.g., the coastal band; Fretwell et al., 2014) and of the target species, may further reduce perception bias and improve animal availability at depth compared with standard RGB sensors (Colefax et al., 2018). Additionally, a larger image data set is critical for understanding the effectiveness of depth-invariant indices derived from the possible band pairs, and whether this effectiveness differs with whale depth and environmental factors (e.g., turbidity). Ultimately, an understanding of the role of water column attenuation correction is critical for the accurate detection of whales, and is particularly relevant given the increasing use of low-cost UAV platforms and machine learning techniques for estimating population abundance and monitoring movement patterns of marine wildlife in shallow coastal habitats.

Alexandra Jones: Conceptualization; formal analysis; investigation; methodology; project administration; writing – original draft; writing – review and editing. Eleanor Bruce: Conceptualization; investigation; methodology; project administration; supervision; writing – review and editing. Kevin Davies: Conceptualization; formal analysis; methodology; supervision; writing – review and editing. Douglas Cato: Methodology; project administration; supervision; writing – review and editing.

Thank you to Scott Sheehan for assistance in conducting the UAV surveys, and to Nathan Angelakis, Natasha Garner, Gabrielle Genty, Kennadie Haigh, Evie Hyland, David Lorieux, Lisa McComb, and Euan Smith for support in the field. VADAR software was designed by Eric Kniest, who customized it for this research.

This research was conducted under authorization from the University of Sydney Animal Ethics Committee (permit 2019/1592) and under permits from the Department of Primary Industries Marine Parks (permit MEAA19/179) and the Department of Planning, Industry and Environment, New South Wales (SL102287). In accordance with restrictions from the Civil Aviation Safety Authority (CASA), all UAV flights were conducted within visual line of sight. Approval was obtained from the Department of Defence to fly UAVs in the restricted airspaces (R421A Nowra and R452 Beecroft Head) covering Jervis Bay. Open access publishing facilitated by The University of Sydney, as part of the Wiley - The University of Sydney agreement via the Council of Australian University Librarians.